13 research outputs found

    Deriving new measurements for real-time reactive systems

    Real-time reactive systems are largely event-driven, interact intensively and continuously with the environment through stimulus-response behavior, and are regulated by strict timing constraints. Examples of such systems include alarm systems, air traffic control systems, nuclear reactor control systems, and telecommunication systems; applications involving real-time reactive software play a mission-critical role in the defense industry. Real-time reactive systems are inherently complex, and the complexity pervades the phases of software development, deployment, and maintenance. Applying formal methods in the development process is an effective way of dealing with this complexity and of assuring quality. One of the goals is to assess the quality of such systems starting from the early phases of their life cycle. Integrating quality measurement into the development framework provides feedback to system developers, allowing them to effectively control the development process and to obtain high reliability in the final product. Quality control is therefore a must when safety-critical real-time reactive systems are developed, and quality assessment must be regarded as a support for controlling the software development process in order to guarantee the final quality. The aim of the thesis is to correctly apply measurement theory to formal descriptions of real-time software, upon which models of object-oriented software measurement can be based. To create the framework for the present work, we survey the theoretical approaches to software measurement. The novelties of the quality measurement methodology lie in its theoretical basis and in a practical, automated measurement data generation process for real-time reactive systems. The proposed approach is applicable to real-time reactive systems modeled as timed labeled transition systems.
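    As a rough illustration of the kind of model the thesis targets, the following Python sketch represents a timed labeled transition system and derives two naive structural measurements from it. The representation, the measures, and the alarm-controller example are all hypothetical, not taken from the thesis.

```python
# Minimal sketch (not the thesis's model): a timed labeled transition
# system as states, labeled transitions, and timing constraints, plus two
# simple structural measurements derived from it.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Transition:
    source: str
    label: str        # event/stimulus name
    target: str
    min_delay: float  # earliest firing time (timing constraint)
    max_delay: float  # deadline; float("inf") means unconstrained

@dataclass
class TimedLTS:
    states: set = field(default_factory=set)
    transitions: list = field(default_factory=list)

    def add(self, t: Transition) -> None:
        self.states.update({t.source, t.target})
        self.transitions.append(t)

    # Two naive structural measures; a real measurement model would be
    # validated against measurement theory (scale types, units).
    def event_count(self) -> int:
        """Number of distinct events labeling the transitions."""
        return len({t.label for t in self.transitions})

    def constrained_ratio(self) -> float:
        """Fraction of transitions carrying a finite deadline."""
        bounded = [t for t in self.transitions if t.max_delay != float("inf")]
        return len(bounded) / len(self.transitions) if self.transitions else 0.0

# Usage: a toy alarm controller with one hard deadline.
alarm = TimedLTS()
alarm.add(Transition("idle", "sensor_trip", "alerting", 0.0, 2.0))
alarm.add(Transition("alerting", "operator_ack", "idle", 0.0, float("inf")))
print(alarm.event_count(), alarm.constrained_ratio())
```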

    Experimental Study Using Functional Size Measurement in Building Estimation Models for Software Project Size

    This paper reports on an experiment that investigates the predictability of software project size from software product size. The predictability problem is analyzed at the early requirements stage by accounting for the size of functional requirements as well as the size of non-functional requirements. The experiment was carried out with 55 graduate students in Computer Science from Concordia University in Canada. In the experiment, a functional size measure and a project size measure were used to build estimation models for sets of web application development projects. The results show that project size is predictable from product size. Further replications of the experiment are, however, planned to obtain more results to confirm or disconfirm our claim.
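    To illustrate the kind of estimation model such an experiment builds, here is a minimal Python sketch fitting project size to product size by ordinary least squares. The data points and the person-hours unit are invented for demonstration and are not the experiment's data.

```python
# Illustrative sketch only: a simple linear estimation model
# project_size = a * product_size + b, fitted by ordinary least squares.
def fit_linear(x, y):
    """Ordinary least squares for y = a*x + b."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

# Hypothetical data: functional size in COSMIC units vs. project size in
# person-hours (NOT the experiment's actual data).
product_size = [45, 60, 52, 80, 95, 70]
effort       = [300, 410, 330, 560, 690, 480]

a, b = fit_linear(product_size, effort)
print(f"estimated project size = {a:.1f} * size + {b:.1f}")
```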

    Reliability model for component-based systems in COSMIC (a case study)

    Software component technology has had a substantial impact on modern IT evolution. The benefits of this technology, such as reusability, complexity management, time and effort reduction, and increased productivity, have been key drivers of its adoption by industry. One of the main issues in building component-based systems is the reliability of the composed functionality of the assembled components. This paper proposes a reliability assessment model based on the architectural configuration of a component-based system and the reliability of the individual components, which is independent of usage or testing. The goal of this research is to improve the reliability assessment process for large software component-based systems over time, and to compare alternative component-based system design solutions prior to implementation. The novelty of the proposed model lies in evaluating component reliability from its behavior specifications, and system reliability from its topology; the reliability assessment is performed in the context of the implementation-independent ISO/IEC 19761:2003 International Standard on the COSMIC method, chosen to provide the component's behavior specifications. In essence, each component of the system is modeled as a discrete-time Markov chain derived from its behavior specifications expressed as extended-state machines. A probabilistic analysis by means of Markov chains is then performed to analyze any uncertainty in the component's behavior. Our hypothesis states that the less uncertainty there is in the component's behavior, the greater the reliability of the component. The system reliability assessment is derived from a typical component-based system architecture with composite reliability structures, which may include compositions of serial, parallel, and p-out-of-n reliability structures. The approach to assessing component-based system reliability in the COSMIC context is illustrated with a railroad crossing case study. © 2008 World Scientific Publishing Company
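    The composite reliability structures mentioned above have standard closed forms; the following Python sketch shows the usual serial, parallel, and p-out-of-n combinations of component reliabilities. It is a generic illustration, not the paper's Markov-chain model, and the component reliabilities are made up.

```python
# Standard reliability compositions (not the paper's specific model):
# serial, parallel, and p-out-of-n structures over component reliabilities.
from math import comb

def serial(rels):
    """System works only if every component works."""
    r = 1.0
    for x in rels:
        r *= x
    return r

def parallel(rels):
    """System works if at least one component works."""
    q = 1.0
    for x in rels:
        q *= (1.0 - x)
    return 1.0 - q

def p_out_of_n(r, p, n):
    """At least p of n identical components (each reliability r) must work."""
    return sum(comb(n, k) * r**k * (1 - r)**(n - k) for k in range(p, n + 1))

# Usage with made-up component reliabilities:
print(serial([0.99, 0.95, 0.98]))   # chain of three components
print(parallel([0.90, 0.90]))       # redundant pair
print(p_out_of_n(0.95, 2, 3))       # 2-out-of-3 voting arrangement
```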

    Assessment of real-time software specifications quality using COSMIC-FFP

    The success of a system development project largely depends on the non-ambiguity of its system-level requirements specification document, in which the requirements are described at the 'system level' rather than at the software and hardware level, and which serves as an input to the design, implementation, and testing phases. The way the system requirements specification document is written is sometimes ambiguous from a software viewpoint. This paper approaches a problem faced in codesign, namely the ill-defined allocation of functionality between hardware and software for real-time systems, and presents an initial solution to it. It discusses which system requirements are to be assigned to the hardware and what is really to be done by the software. Different decisions can then lead to various alternative allocations of functions between hardware and software: this affects what software will be built and, correspondingly, the final functional size of the software when measured with the COSMIC-FFP measurement method. This paper presents an initial step towards understanding the applicability of the COSMIC-FFP functional size measurement method in assessing the hardware-software requirements allocation, and illustrates the approach on a Steam Boiler Controller case study. © 2008 Springer-Verlag Berlin Heidelberg
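    For readers unfamiliar with COSMIC-FFP, the method sizes a functional process by counting its data movements (Entry, Exit, Read, Write), each worth 1 Cfsu. The sketch below applies that rule to a hypothetical steam boiler functional process; the listed movements are illustrative, not taken from the case study.

```python
# Hedged sketch of the COSMIC-FFP counting rule: the functional size of a
# functional process is the number of its data movements (Entry, Exit,
# Read, Write), each contributing 1 Cfsu. The movements below are invented.
MOVEMENT_TYPES = {"Entry", "Exit", "Read", "Write"}

def functional_size(movements):
    """Sum 1 Cfsu per valid data movement."""
    for kind, _ in movements:
        if kind not in MOVEMENT_TYPES:
            raise ValueError(f"unknown data movement type: {kind}")
    return len(movements)

# One hypothetical functional process of a steam boiler controller:
process = [
    ("Entry", "water-level reading from sensor"),
    ("Read",  "stored safety thresholds"),
    ("Exit",  "pump on/off command"),
    ("Write", "logged controller state"),
]
print(functional_size(process), "Cfsu")
```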

    Techniques for quantitative analysis of software quality throughout the SDLC: The SWEBOK guide coverage

    This paper presents an overview of quantitative analysis techniques for software quality and their applicability during the software development life cycle (SDLC). These include the Seven Basic Tools of Quality, Statistical Process Control, and Six Sigma; the paper highlights how these techniques can be used for managing and controlling the quality of software during specification, design, implementation, testing, and maintenance. We verify whether these techniques, which are generally accepted for most projects most of the time and whose value is recognized by the peer community, have indeed been included in the SWEBOK Guide. © 2010 IEEE
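    As a concrete instance of one surveyed technique, the sketch below computes 3-sigma c-chart limits, a standard Statistical Process Control construction, for defect counts per inspection unit; the counts are fabricated for illustration.

```python
# Illustrative SPC sketch: 3-sigma control limits for a c-chart tracking
# defect counts per inspection unit (e.g., defects found per code review).
from math import sqrt

def c_chart_limits(counts):
    """Center line and 3-sigma limits for a Poisson-distributed count."""
    c_bar = sum(counts) / len(counts)
    ucl = c_bar + 3 * sqrt(c_bar)
    lcl = max(0.0, c_bar - 3 * sqrt(c_bar))
    return lcl, c_bar, ucl

# Fabricated defect counts, one per review:
defects_per_review = [4, 7, 3, 5, 6, 2, 5, 4]
lcl, center, ucl = c_chart_limits(defects_per_review)
print(f"LCL={lcl:.2f} CL={center:.2f} UCL={ucl:.2f}")
# Counts outside the limits would signal an out-of-control process.
out_of_control = [c for c in defects_per_review if not lcl <= c <= ucl]
```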

    Managing Requirement Volatility in an Ontology-Driven Clinical LIMS Using Category Theory (International Journal of Telemedicine and Applications)

    Requirement volatility is an issue in software engineering in general, and in Web-based clinical applications in particular, that often originates from an incomplete knowledge of the domain of interest. With advances in health science, many features and functionalities need to be added to, or removed from, existing software applications in the biomedical domain. At the same time, the increasing complexity of biomedical systems makes them more difficult to understand, and consequently it is more difficult to define their requirements, which contributes considerably to their volatility. In this paper, we present a novel agent-based approach for analyzing and managing volatile and dynamic requirements in an ontology-driven laboratory information management system (LIMS) designed for Web-based case reporting in medical mycology. The proposed framework is empowered with ontologies and formalized using category theory to provide a deep and common understanding of the functional and non-functional requirement hierarchies and their interrelations, and to trace the effects of a change on the conceptual framework.
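    The traceability idea can be pictured operationally: if requirements are objects and dependency links are morphisms, composing morphisms yields the transitive impact of a change. The sketch below is a loose Python rendering of that intuition, not the paper's categorical formalization, and all requirement names are hypothetical.

```python
# Loose sketch (not the paper's formalization): requirements as objects,
# dependency links as morphisms; composing morphisms gives the transitive
# change-impact set used for traceability.
def impact_set(dependencies, changed):
    """All requirements reachable from `changed` via composed dependencies."""
    impacted, frontier = set(), [changed]
    while frontier:
        current = frontier.pop()
        for tail in dependencies.get(current, []):
            if tail not in impacted:
                impacted.add(tail)
                frontier.append(tail)  # follow composite morphisms
    return impacted

# Morphisms: "X -> Y" means a change to X may affect Y (hypothetical names).
deps = {
    "case-report-form": ["validation-rules", "ontology-terms"],
    "ontology-terms":   ["search-feature"],
    "validation-rules": [],
}
print(impact_set(deps, "case-report-form"))
```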

    A case-study using the COSMIC-FFP measurement method for assessing real-time systems specifications

    The success of a system development project largely depends on the non-ambiguity of its system-level requirements specification document, where the requirements are described at the system level rather than at the software and hardware level. There may be missing details about the allocation of functions between hardware and software, both for the developers who will later have to implement such requirements and for the software measurers who must immediately attempt to measure the software functional size of such requirements. Different interpretations of the specification would lead to different software being built, with a different functional size. The research described in this paper is concerned with the challenges inherent in understanding initial system requirements in textual form and in assessing the codesign decisions using functional size measurement. The paper aims at understanding the applicability of the COSMIC-FFP functional size measurement method in assessing the hardware-software requirements allocation, and illustrates the approach on a Steam Boiler Controller case study.
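    To make the codesign point concrete, the sketch below sizes the same hypothetical requirement under two allocation alternatives using the COSMIC-FFP rule of 1 Cfsu per data movement; both movement lists are invented, not drawn from the Steam Boiler specification.

```python
# Hedged sketch of the codesign effect above: two allocation decisions for
# the same system requirement yield different software functional sizes.
def cfsu(movements):
    """COSMIC-FFP: 1 Cfsu per data movement (Entry/Exit/Read/Write)."""
    return len(movements)

# Alternative A: software validates the water level itself.
software_heavy = [
    ("Entry", "raw water-level"), ("Read", "thresholds"),
    ("Exit", "pump command"), ("Write", "alarm log"),
]
# Alternative B: a hardware comparator validates the level; software only
# forwards the already-validated signal.
hardware_heavy = [
    ("Entry", "validated level flag"), ("Exit", "pump command"),
]
print("A:", cfsu(software_heavy), "Cfsu  B:", cfsu(hardware_heavy), "Cfsu")
```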